BE/BAT 485/585 - Lab-6
Ex-1
Danielle Tadych

In this exercise you will learn and understand

  1. How hyperspectral differs from Multispectral
  2. How to Perform Spectral and Spatial Convolution using a sensor RSR information
  3. How to simulate multispectral data from hyperspectral data
  4. How to create data for ground truthing
  5. And finally you will learn how to use the HDF library to open HDF(5) files. Most remote sensing is stored in this format

Convolution is a form of simulating multispectral data from hyperspectral data via the Relative Spectral Responses (RSR). RSRs define the shape of the multispectral sensors and resulting data.


The figure above defines how much each portion of the spectrum contributes to the coarse spectral band.
With this info we can use high definition hyperspectral data to generate or simulate any sensor's spectral band

For example to generate the NIR band (above) we integrate all the hyperspectral data in the range defined by the RSR of the NIR band as would the actual multispectral sensor do.

The spectral convolution process uses the following equation:

$ L_b= \frac { { \int \limits _{\lambda_1} ^{\lambda_2}} L_h.SRF.d { \lambda} } { { \int \limits _{\lambda_1} ^{\lambda_2}} SRF.d { \lambda} } $

Where:

  1. $\lambda_1$ & $\lambda_2$ are the band's spectral range
  2. SRF is the RSR value for the portion of the spectrum
  3. d$\lambda$ is a hyperspectral step
  4. L$_b$ is the multispectral radiance/reflectance computed
  5. L$_h$ is the radiance/reflectance for the considered range

The spatial convolution process uses the following equation:

$ V = \frac { 1}{ n.m } \sum \limits _{i=1} ^{n} \sum \limits _{j=1} ^{m} v_{ij} $

Where:

  1. V is the new pixel (usually coarser) brightness value
  2. v$_i$$_j$ is the finer resoltuion pixel brightness value
  3. n and m define the matrix size in the finer resolution image that would correspond to a single pixel in the coarser resoltuion image

The spatial convolution is an aggregation process of the finer resolution pixles to the coarser ones. For example to create a 250m resolution pixel from 1 m pixels we simply need to average 250x250 pixels window. In practice there are other ways to doing this, that we will cover later.

In [1]:
# load library modules 
import os
import numpy as np
import matplotlib.pyplot as plt
import viplab_lib3 as vip
# convolution library
import viplab_convolution as vipConv

First let's import the required libraries/packages¶

In [4]:
import h5py  
import io
from PIL import Image

Set display preferences so that plots are inline (meaning any images you output from your code will show up below the cell in the notebook) and turn off plot warnings (debug)¶

In [5]:
# This code forces plotting inline within the code (below the cell) and not outsoide in separate windows
%matplotlib inline
import warnings
warnings.filterwarnings('ignore')

Read hdf5 using the pyhdf package (if you do not have it install it)¶

To install this package with conda run¶

conda install -c conda-forge pyhdf¶

In [8]:
conda install -c conda-forge pyhdf
Collecting package metadata (current_repodata.json): done
Solving environment: done


==> WARNING: A newer version of conda exists. <==
  current version: 4.11.0
  latest version: 4.12.0

Please update conda by running

    $ conda update -n base -c defaults conda



## Package Plan ##

  environment location: /Users/condongroup/opt/miniconda3/envs/remsens

  added / updated specs:
    - pyhdf


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    certifi-2021.10.8          |   py38h50d1736_2         145 KB  conda-forge
    openssl-1.1.1n             |       h6c3fc93_0         1.9 MB  conda-forge
    pyhdf-0.10.3               |   py38he785675_1         145 KB  conda-forge
    python_abi-3.8             |           2_cp38           4 KB  conda-forge
    ------------------------------------------------------------
                                           Total:         2.2 MB

The following NEW packages will be INSTALLED:

  hdf4               conda-forge/osx-64::hdf4-4.2.15-hefd3b78_3
  pyhdf              conda-forge/osx-64::pyhdf-0.10.3-py38he785675_1
  python_abi         conda-forge/osx-64::python_abi-3.8-2_cp38

The following packages will be SUPERSEDED by a higher-priority channel:

  ca-certificates    pkgs/main::ca-certificates-2022.3.29-~ --> conda-forge::ca-certificates-2021.10.8-h033912b_0
  certifi            pkgs/main::certifi-2021.10.8-py38hecd~ --> conda-forge::certifi-2021.10.8-py38h50d1736_2
  openssl              pkgs/main::openssl-1.1.1n-hca72f7f_0 --> conda-forge::openssl-1.1.1n-h6c3fc93_0



Downloading and Extracting Packages
openssl-1.1.1n       | 1.9 MB    | ##################################### | 100% 
python_abi-3.8       | 4 KB      | ##################################### | 100% 
certifi-2021.10.8    | 145 KB    | ##################################### | 100% 
pyhdf-0.10.3         | 145 KB    | ##################################### | 100% 
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

Note: you may need to restart the kernel to use updated packages.
In [16]:
hdf5_file = 'NEON_D14_SRER_DP3_502000_3523000_reflectance.h5'
hdf5= h5py.File(hdf5_file, "r")

Explore NEON AOP HDF5 Reflectance Files¶

We can look inside the HDF5 dataset with the h5py visititems function.¶

The function defined below displays all datasets

stored in the hdf5 file and their locations within the hdf5 file¶

In [17]:
#list_dataset lists the names of datasets in an hdf5 file
def list_dataset(name,node):
    if isinstance(node, h5py.Dataset):
        print(name)

hdf5.visititems(list_dataset)
SRER/Reflectance/Metadata/Ancillary_Imagery/Aerosol_Optical_Depth
SRER/Reflectance/Metadata/Ancillary_Imagery/Aspect
SRER/Reflectance/Metadata/Ancillary_Imagery/Cast_Shadow
SRER/Reflectance/Metadata/Ancillary_Imagery/Dark_Dense_Vegetation_Classification
SRER/Reflectance/Metadata/Ancillary_Imagery/Data_Selection_Index
SRER/Reflectance/Metadata/Ancillary_Imagery/Haze_Cloud_Water_Map
SRER/Reflectance/Metadata/Ancillary_Imagery/Illumination_Factor
SRER/Reflectance/Metadata/Ancillary_Imagery/Path_Length
SRER/Reflectance/Metadata/Ancillary_Imagery/Sky_View_Factor
SRER/Reflectance/Metadata/Ancillary_Imagery/Slope
SRER/Reflectance/Metadata/Ancillary_Imagery/Smooth_Surface_Elevation
SRER/Reflectance/Metadata/Ancillary_Imagery/Visibility_Index_Map
SRER/Reflectance/Metadata/Ancillary_Imagery/Water_Vapor_Column
SRER/Reflectance/Metadata/Ancillary_Imagery/Weather_Quality_Indicator
SRER/Reflectance/Metadata/Coordinate_System/Coordinate_System_String
SRER/Reflectance/Metadata/Coordinate_System/EPSG Code
SRER/Reflectance/Metadata/Coordinate_System/Map_Info
SRER/Reflectance/Metadata/Coordinate_System/Proj4
SRER/Reflectance/Metadata/Logs/175032/ATCOR_Input_file
SRER/Reflectance/Metadata/Logs/175032/ATCOR_Processing_Log
SRER/Reflectance/Metadata/Logs/175032/Shadow_Processing_Log
SRER/Reflectance/Metadata/Logs/175032/Skyview_Processing_Log
SRER/Reflectance/Metadata/Logs/175032/Solar_Azimuth_Angle
SRER/Reflectance/Metadata/Logs/175032/Solar_Zenith_Angle
SRER/Reflectance/Metadata/Logs/175509/ATCOR_Input_file
SRER/Reflectance/Metadata/Logs/175509/ATCOR_Processing_Log
SRER/Reflectance/Metadata/Logs/175509/Shadow_Processing_Log
SRER/Reflectance/Metadata/Logs/175509/Skyview_Processing_Log
SRER/Reflectance/Metadata/Logs/175509/Solar_Azimuth_Angle
SRER/Reflectance/Metadata/Logs/175509/Solar_Zenith_Angle
SRER/Reflectance/Metadata/Logs/175923/ATCOR_Input_file
SRER/Reflectance/Metadata/Logs/175923/ATCOR_Processing_Log
SRER/Reflectance/Metadata/Logs/175923/Shadow_Processing_Log
SRER/Reflectance/Metadata/Logs/175923/Skyview_Processing_Log
SRER/Reflectance/Metadata/Logs/175923/Solar_Azimuth_Angle
SRER/Reflectance/Metadata/Logs/175923/Solar_Zenith_Angle
SRER/Reflectance/Metadata/Logs/180334/ATCOR_Input_file
SRER/Reflectance/Metadata/Logs/180334/ATCOR_Processing_Log
SRER/Reflectance/Metadata/Logs/180334/Shadow_Processing_Log
SRER/Reflectance/Metadata/Logs/180334/Skyview_Processing_Log
SRER/Reflectance/Metadata/Logs/180334/Solar_Azimuth_Angle
SRER/Reflectance/Metadata/Logs/180334/Solar_Zenith_Angle
SRER/Reflectance/Metadata/Spectral_Data/FWHM
SRER/Reflectance/Metadata/Spectral_Data/Wavelength
SRER/Reflectance/Metadata/to-sensor_azimuth_angle
SRER/Reflectance/Metadata/to-sensor_zenith_angle
SRER/Reflectance/Reflectance_Data

You can see that there is a lot of information stored inside this reflectance hdf5 file.¶

Most of this information is metadata (data about ### the reflectance data), for example,¶

this file stores input parameters used in the atmospheric correction.¶

We will work with the reflectance data (hyperspectral cube), and the corresponding geospatial information,¶

stored in Metadata/Coordinate_System:¶

HARV/Reflectance/Reflectance_Data¶

HARV/Reflectance/Metadata/Coordinate_System/¶

We can also display the name, shape, and type of each of these datasets using the

function defined below, which is also called with the visititems method:¶

In [18]:
#ls_dataset displays the name, shape, and type of datasets in hdf5 file
def ls_dataset(name,node):
    if isinstance(node, h5py.Dataset):
        print(node)
In [19]:
# Use the visititems methods to learn abotu the data 
hdf5.visititems(ls_dataset)
# Look at the the last line printed, that is teh data we will work with 
<HDF5 dataset "Aerosol_Optical_Depth": shape (1000, 1000), type "<i2">
<HDF5 dataset "Aspect": shape (1000, 1000), type "<f4">
<HDF5 dataset "Cast_Shadow": shape (1000, 1000), type "|u1">
<HDF5 dataset "Dark_Dense_Vegetation_Classification": shape (1000, 1000), type "|u1">
<HDF5 dataset "Data_Selection_Index": shape (1000, 1000), type "<i4">
<HDF5 dataset "Haze_Cloud_Water_Map": shape (1000, 1000), type "|u1">
<HDF5 dataset "Illumination_Factor": shape (1000, 1000), type "|u1">
<HDF5 dataset "Path_Length": shape (1000, 1000), type "<f4">
<HDF5 dataset "Sky_View_Factor": shape (1000, 1000), type "|u1">
<HDF5 dataset "Slope": shape (1000, 1000), type "<f4">
<HDF5 dataset "Smooth_Surface_Elevation": shape (1000, 1000), type "<f4">
<HDF5 dataset "Visibility_Index_Map": shape (1000, 1000), type "|u1">
<HDF5 dataset "Water_Vapor_Column": shape (1000, 1000), type "<f4">
<HDF5 dataset "Weather_Quality_Indicator": shape (1000, 1000, 3), type "|u1">
<HDF5 dataset "Coordinate_System_String": shape (), type "|O">
<HDF5 dataset "EPSG Code": shape (), type "|O">
<HDF5 dataset "Map_Info": shape (), type "|O">
<HDF5 dataset "Proj4": shape (), type "|O">
<HDF5 dataset "ATCOR_Input_file": shape (), type "|O">
<HDF5 dataset "ATCOR_Processing_Log": shape (), type "|O">
<HDF5 dataset "Shadow_Processing_Log": shape (), type "|O">
<HDF5 dataset "Skyview_Processing_Log": shape (), type "|O">
<HDF5 dataset "Solar_Azimuth_Angle": shape (), type "<f4">
<HDF5 dataset "Solar_Zenith_Angle": shape (), type "<f4">
<HDF5 dataset "ATCOR_Input_file": shape (), type "|O">
<HDF5 dataset "ATCOR_Processing_Log": shape (), type "|O">
<HDF5 dataset "Shadow_Processing_Log": shape (), type "|O">
<HDF5 dataset "Skyview_Processing_Log": shape (), type "|O">
<HDF5 dataset "Solar_Azimuth_Angle": shape (), type "<f4">
<HDF5 dataset "Solar_Zenith_Angle": shape (), type "<f4">
<HDF5 dataset "ATCOR_Input_file": shape (), type "|O">
<HDF5 dataset "ATCOR_Processing_Log": shape (), type "|O">
<HDF5 dataset "Shadow_Processing_Log": shape (), type "|O">
<HDF5 dataset "Skyview_Processing_Log": shape (), type "|O">
<HDF5 dataset "Solar_Azimuth_Angle": shape (), type "<f4">
<HDF5 dataset "Solar_Zenith_Angle": shape (), type "<f4">
<HDF5 dataset "ATCOR_Input_file": shape (), type "|O">
<HDF5 dataset "ATCOR_Processing_Log": shape (), type "|O">
<HDF5 dataset "Shadow_Processing_Log": shape (), type "|O">
<HDF5 dataset "Skyview_Processing_Log": shape (), type "|O">
<HDF5 dataset "Solar_Azimuth_Angle": shape (), type "<f4">
<HDF5 dataset "Solar_Zenith_Angle": shape (), type "<f4">
<HDF5 dataset "FWHM": shape (426,), type "<f4">
<HDF5 dataset "Wavelength": shape (426,), type "<f4">
<HDF5 dataset "to-sensor_azimuth_angle": shape (1000, 1000), type "<f4">
<HDF5 dataset "to-sensor_zenith_angle": shape (1000, 1000), type "<f4">
<HDF5 dataset "Reflectance_Data": shape (1000, 1000, 426), type "<i2">

Now that we can see the structure of the hdf5 file¶

let's take a look at some of the information that is stored inside it.¶

Let's start by extracting the reflectance data, which is nested under¶

HARV/Reflectance/Reflectance_Data:¶

Guess where did HARV below came from?

In [20]:
hdf5
Out[20]:
<HDF5 file "NEON_D14_SRER_DP3_502000_3523000_reflectance.h5" (mode r)>
In [21]:
surf_refl = hdf5['SRER']['Reflectance']
print(surf_refl)
<HDF5 group "/SRER/Reflectance" (2 members)>

These two members of the HDF5 group /HARV/Reflectance are Metadata and Reflectance_Data.¶

Let's save the reflectance data as the variable so we can work with it:

In [22]:
reflArray = surf_refl['Reflectance_Data']
print(reflArray)
## Notice the data dimension and shape
<HDF5 dataset "Reflectance_Data": shape (1000, 1000, 426), type "<i2">

Extract the size/shape of this reflectance array that we have justed extracted using the shape method:¶

In [26]:
refl_shape = reflArray.shape
print('Reflectance Data Dimensions:',refl_shape)
Reflectance Data Dimensions: (1000, 1000, 426)
In [24]:
mapInfo = surf_refl['Metadata']['Coordinate_System']['Map_Info']
print('Data Cube Map Info: ',mapInfo[()])
#First convert mapInfo to a string
mapInfo_string = str(mapInfo[()]) #convert to string
#split the strings using the separator comma "," 
mapInfo_split = mapInfo_string.split(",") 
print(mapInfo_split)
#Extract the upper left-hand corner coordinates from mapInfo
xMin = float(mapInfo_split[3]) 
yMax = float(mapInfo_split[4])

#Calculate the xMax and yMin values from the dimensions using the array size and pixel size 
xMax = xMin + (refl_shape[1]*res[0]) #xMax = left edge + (# of columns * x pixel resolution)
yMin = yMax - (refl_shape[0]*res[1]) #yMin = top edge - (# of rows * y pixel resolution)

#Define extent as a tuple:
cube_ext = (xMin, xMax, yMin, yMax)
print('cube_ext: ',cube_ext)
print('cube_ext type: ',type(cube_ext))
Data Cube Map Info: b'UTM,  1.000,  1.000,       502000.00,       3524000.0,       1.0000000,       1.0000000,  12,  North,  WGS-84,  units=Meters, 0'
["b'UTM", '  1.000', '  1.000', '       502000.00', '       3524000.0', '       1.0000000', '       1.0000000', '  12', '  North', '  WGS-84', '  units=Meters', " 0'"]
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
Input In [24], in <module>
     10 yMax = float(mapInfo_split[4])
     12 #Calculate the xMax and yMin values from the dimensions using the array size and pixel size 
---> 13 xMax = xMin + (refl_shape[1]*res[0]) #xMax = left edge + (# of columns * x pixel resolution)
     14 yMin = yMax - (refl_shape[0]*res[1]) #yMin = top edge - (# of rows * y pixel resolution)
     16 #Define extent as a tuple:

NameError: name 'res' is not defined

</center>

NEON hyperspectral data contain 426 spectral bands, and when working with tiled data,¶

the spatial dimensions are 1000 x 1000, where each pixel represents 1 meter.¶

Now let's take a look at the wavelength values. First, we will extract wavelength¶

information from the harv_refl variable that we created:¶

In [27]:
#define the wavelengths variable
wavelengths = surf_refl['Metadata']['Spectral_Data']['Wavelength']

#View wavelength information and values
print('wavelengths:',wavelengths)
wavelengths: <HDF5 dataset "Wavelength": shape (426,), type "<f4">

We can use numpy (imported as np) to see the minimum and maximum wavelength values:¶

In [28]:
# Display min & max wavelengths
print('min wavelength:', np.amin(wavelengths),'nm')
print('max wavelength:', np.amax(wavelengths),'nm')

## Notice that we are not having to define these manually and that we are using what we call metdata to get this info
min wavelength: 381.5437 nm
max wavelength: 2509.932 nm

Now find out the band widths (distance between center bands of two adjacent bands).¶

Let's try this for the first two bands and the last two bands. Remember that Python¶

uses 0-based indexing ([0] represents the first value in an array), and note that you¶

can also use negative numbers to splice values from the end of an array¶

([-1] represents the last value in an array).¶

In [29]:
#show the band widths between the first 2 bands and last 2 bands 
print('band width between first 2 bands =',(wavelengths[1]-wavelengths[0]),'nm')
print('band width between last 2 bands =',(wavelengths[-1]-wavelengths[-2]),'nm')
band width between first 2 bands = 5.0079956 nm
band width between last 2 bands = 5.0080566 nm

The center wavelengths recorded in this hyperspectral cube range from 383.7655 - 2512.097 nm,¶

and each band covers a range of ~5 nm. Now let's extract spatial information, which is stored under¶

HARV/Reflectance/Metadata/Coordinate_System/Map_Info¶

This will tell us where the data came from geographically (in this case UTM coordinate system. We will learn later what that is)¶

In [30]:
mapInfo = surf_refl['Metadata']['Coordinate_System']['Map_Info']
print('Data Cube Map Info:',mapInfo[()])
Data Cube Map Info: b'UTM,  1.000,  1.000,       502000.00,       3524000.0,       1.0000000,       1.0000000,  12,  North,  WGS-84,  units=Meters, 0'

Understanding the output :

Here we can get the spatial information about the reflectance data.¶

Below is a break down of what each of these values means/represent:¶

UTM - coordinate system (Universal Transverse Mercator)¶

1.000, 1.000,725000.00,4713000.0 UTM - coordinates (meters) of the map origin, which refers to the upper-left corner of the image (xMin, yMax).¶

1.0000000,1.0000000 - pixel resolution (meters)¶

18 - UTM zone¶

North - UTM hemisphere (North for all NEON sites)¶

WGS-84 - reference ellipoid¶

units=Meters'¶

Now, let's extract relevant information from the Map_Info metadata to define the spatial extent of this dataset.¶

To do this, we can use the split method to break up this string into separate values:¶

In [31]:
#First convert mapInfo to a string
mapInfo_string = str(mapInfo[()]) #convert to string
In [32]:
#split the strings using the separator "," 
mapInfo_split = mapInfo_string.split(",") 
print(mapInfo_split)
["b'UTM", '  1.000', '  1.000', '       502000.00', '       3524000.0', '       1.0000000', '       1.0000000', '  12', '  North', '  WGS-84', '  units=Meters', " 0'"]

Now we can extract the spatial information we need from the map info values,¶

convert them to the appropriate data type (float) and store it in a way that¶

will enable us to access and apply it later when we want to plot the data:¶

In [33]:
#Extract the resolution & convert to floating decimal number
res = float(mapInfo_split[5]),float(mapInfo_split[6])
print('Resolution:',res)
Resolution: (1.0, 1.0)
In [34]:
#Extract the upper left-hand corner coordinates from mapInfo
xMin = float(mapInfo_split[3]) 
yMax = float(mapInfo_split[4])

#Calculate the xMax and yMin values from the dimensions
xMax = xMin + (refl_shape[1]*res[0]) #xMax = left edge + (# of columns * x pixel resolution)
yMin = yMax - (refl_shape[0]*res[1]) #yMin = top edge - (# of rows * y pixel resolution)

Now we can define the spatial exten as a PYTHON tuple [go back to the reference document I shared with you to understand what a TUPLE is]¶

(xMin, xMax, yMin, yMax). This is the format required for applying the spatial extent when plotting with matplotlib.pyplot¶

In [35]:
#Define extent as a tuple:
cube_ext = (xMin, xMax, yMin, yMax)
print('cube_ext:',cube_ext)
print('cube_ext type:',type(cube_ext))
cube_ext: (502000.0, 503000.0, 3523000.0, 3524000.0)
cube_ext type: <class 'tuple'>

Extract a Single Band from Array¶

While it is useful to have all the data contained in a hyperspectral cube,¶

it is difficult to visualize all this information at once. We can extract a¶

single band (representing a ~5nm spectral slive, approximating a single wavelength)¶

from the cube by using splicing as follows.¶

Note that we have to cast the reflectance data into the type float.¶

Recall that since Python indexing starts at 0 instead of 1, in order to extract Band 56,¶

we need to use the index 55¶

In [36]:
Band_56 = reflArray[:,:,55].astype(float)
print('b56 type:',type(Band_56))
print('b56 shape:',Band_56.shape)
print('Band 56 Reflectance:\n',Band_56)
b56 type: <class 'numpy.ndarray'>
b56 shape: (1000, 1000)
Band 56 Reflectance:
 [[ 720.  787. 1125. ... 1580.  983.  840.]
 [ 929. 1289. 1289. ... 1661.  989.  989.]
 [ 765. 1038. 1475. ... 1928. 1080.  670.]
 ...
 [1383. 1425. 1757. ... 3972. 4721. 4826.]
 [1344. 1475. 1711. ... 4904. 4711. 4631.]
 [1565. 1709.  924. ... 4288. 4378. 3744.]]

Here we can see that we extracted a 2-D array (1000 x 1000) of the scaled reflectance data¶

corresponding to the wavelength band 56. Before we can use the data, we need to clean it up¶

a little. We'll show how to do this below.¶

Scale factor and No Data Value¶

This array represents the scaled reflectance for band 56.¶

NEON AOP reflectance data uses a Data_Ignore_Value of -9999 to represent missing data (often called NaN),¶

and a reflectance Scale_Factor of 10000.0 in order to save disk space (can use lower precision this way).¶

We can extract and apply the Data_Ignore_Value and Scale_Factor as follows:¶

In [37]:
#View and apply scale factor and data ignore value
scaleFactor = reflArray.attrs['Scale_Factor']
noDataValue = reflArray.attrs['Data_Ignore_Value']
print('Scale Factor:',scaleFactor)
print('Data Ignore Value:',noDataValue)

Band_56[Band_56==int(noDataValue)]=np.nan
Band_56 = Band_56/scaleFactor
print('Cleaned Band 56 Reflectance:\n',Band_56)
Scale Factor: 10000.0
Data Ignore Value: -9999.0
Cleaned Band 56 Reflectance:
 [[0.072  0.0787 0.1125 ... 0.158  0.0983 0.084 ]
 [0.0929 0.1289 0.1289 ... 0.1661 0.0989 0.0989]
 [0.0765 0.1038 0.1475 ... 0.1928 0.108  0.067 ]
 ...
 [0.1383 0.1425 0.1757 ... 0.3972 0.4721 0.4826]
 [0.1344 0.1475 0.1711 ... 0.4904 0.4711 0.4631]
 [0.1565 0.1709 0.0924 ... 0.4288 0.4378 0.3744]]

Plot single reflectance band¶

Now we can plot this band using the Python package matplotlib.pyplot,¶

which we imported at the beginning of the lesson as plt.¶

Note that the default colormap is jet unless otherwise specified.¶

You can explore using different colormaps on your own; see the mapplotlib¶

colormaps for for other options.¶

In [38]:
data_plot = plt.imshow(Band_56,extent=cube_ext,cmap='Greys') 

The image above looks very washed out, and this is due to the range and distribution of reflectance values¶

that we are plotting. We can analyze the data via a histogram to see its distribution.¶

Plot histogram¶

We can plot a histogram using the matplotlib.pyplot.hist function.¶

Note that this function won't work if there are any NaN values,¶

so we can ensure we are only plotting the real data values using the call below.¶

You can also specify the # of bins you want to divide the data into.¶

In [39]:
plt.hist(Band_56[~np.isnan(Band_56)],50); #50 signifies the # of bins
In [40]:
### We can see that most of the reflectance values are within the range 0 - 0.4 
#### In order to show more contrast in the image, we can adjust the colorlimit (clim) to 0-0.4 (or 0 to 0.2 only)
In [41]:
data_plot = plt.imshow(Band_56,extent=cube_ext,cmap='gist_earth',clim=(0,0.4)) 
plt.title('Data Cube Band 56 Reflectance');

Now that you know how to load a NEON-AOP data cube form HDF 5 data format and read any data from it, let us wortk with the cntent¶

In [42]:
# Assign some common bands by name and number. Feel free to experiment 
# and use other regions of the spectrum that correspond to the color of interest.
# Review the references slides in the Lab. presentation about where these band numbers come from 
bandRED=48
bandGREEN=34
bandBLUE=17
bandNIR=97
In [43]:
# Combine the Red, Green and Blue data into an RGB image for display
print("Creating RGB Image...")
RGBImage=vip.Image_getRGB(reflArray[:,:,bandRED],reflArray[:,:,bandGREEN],reflArray[:,:,bandBLUE],5000)

# Display RGB True color Image
plt.figure(figsize=(10,10))
plt.title('RGB True color')
plt.axis('off')
plt.imshow(RGBImage) 
Creating RGB Image...
Out[43]:
<matplotlib.image.AxesImage at 0x7f9865778670>
In [60]:
# create Spectral Convolution object
NEON=vipConv.SpecConvolution()

#Load NEON wavelength values that match each NEON band
#WavesList=NEON.Load_Wavesfile('./Data/NEON_wavelength_values.txt')
# Will use the list loaded above 
WavesList=wavelengths

#Load simulated sensor RSR function
NEON.Load_RSR('NEON_Sentinel2A_RSR.csv')

#display RSR basic information
NEON.displayRSR()

#Retrieve RSR for a specific band
S2_RSR_RED=NEON.getBand_RSR('RED')

S2_RSR_NIR=NEON.getBand_RSR('NIR')

#create a plot of the RSR
plt.figure()
plt.title('RSR Sensor - Put NAME Here')
plt.xlabel('Wavelength (nm)')
plt.ylabel('RSR')

#RSR_RED[:,0] is the X Axis
#RSR_RED[:,1] is the Y Axis
plt.plot(S2_RSR_RED[:,0],S2_RSR_RED[:,1],color='red')
plt.plot(S2_RSR_NIR[:,0],S2_RSR_NIR[:,1],color='black')
plt.grid(True)
plt.tight_layout()
plt.show()
Reading RSR: NEON_Sentinel2A_RSR.csv
Convolution:
Sensor: SENTINEL2A
Resolution: 10.0 meters
Bands: ['BLUE', 'GREEN', 'RED', 'NIR']

Now create and add plots for the other bands¶

In [46]:
#apply convolution to dataset
SENTINELCube=NEON.Spectral_Spatial_Convolution(WavesList,reflArray)
Processing convolution...
 band: BLUE
 min= 440.0 max= 535.0
  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30  31 .
 band: GREEN
 min= 537.0 max= 582.0
  32  33  34  35  36  37  38  39  40  41 .
 band: RED
 min= 646.0 max= 684.0
  53  54  55  56  57  58  59  60  61 .
 band: NIR
 min= 773.0 max= 908.0
  79  80  81  82  83  84  85  86  87  88  89  90  91  92  93  94  95  96  97  98  99  100  101  102  103  104  105  106 .
Spatial Resampling at  10.0  [ 100 , 100 ]
In [47]:
RGBImage=vip.Image_getRGB(SENTINELCube[:,:,2],SENTINELCube[:,:,1],SENTINELCube[:,:,0],8000)
# Display RGB True color Image
plt.figure(figsize=(10,10))
plt.title('SENTINEL simulated RGB at actual Sentinel spatial resolution')
plt.imshow(RGBImage)
Out[47]:
<matplotlib.image.AxesImage at 0x7f984bce9d00>
In [48]:
# Rerun Convolution at a different spatial resolution, 5 meters
SENTINELCube=NEON.Spectral_Spatial_Convolution(WavesList,reflArray,5)
RGBImage=vip.Image_getRGB(SENTINELCube[:,:,2],SENTINELCube[:,:,1],SENTINELCube[:,:,0],8000)
Processing convolution...
 band: BLUE
 min= 440.0 max= 535.0
  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30  31 .
 band: GREEN
 min= 537.0 max= 582.0
  32  33  34  35  36  37  38  39  40  41 .
 band: RED
 min= 646.0 max= 684.0
  53  54  55  56  57  58  59  60  61 .
 band: NIR
 min= 773.0 max= 908.0
  79  80  81  82  83  84  85  86  87  88  89  90  91  92  93  94  95  96  97  98  99  100  101  102  103  104  105  106 .
Spatial Resampling at  5  [ 200 , 200 ]
In [49]:
# Display RGB True color Image
plt.figure(figsize=(10,10))
plt.title('SENTINEL simulated RGB at hypothetical 5 meters')
plt.imshow(RGBImage)
Out[49]:
<matplotlib.image.AxesImage at 0x7f984bcf4b80>
In [50]:
# ReRun only spectral Convolution (not spatial) 
SENTINELCube=NEON.Spectral_Convolution(WavesList,reflArray)
RGBImage=vip.Image_getRGB(SENTINELCube[:,:,2],SENTINELCube[:,:,1],SENTINELCube[:,:,0],8000)
Processing convolution...
 band: BLUE
 min= 440.0 max= 535.0
  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30  31 .
 band: GREEN
 min= 537.0 max= 582.0
  32  33  34  35  36  37  38  39  40  41 .
 band: RED
 min= 646.0 max= 684.0
  53  54  55  56  57  58  59  60  61 .
 band: NIR
 min= 773.0 max= 908.0
  79  80  81  82  83  84  85  86  87  88  89  90  91  92  93  94  95  96  97  98  99  100  101  102  103  104  105  106 .
In [51]:
# Display RGB True color Image
plt.figure(figsize=(10,10))
plt.title('SENTINEL simulated but no spatial convolution')
plt.imshow(RGBImage,extent=cube_ext)
Out[51]:
<matplotlib.image.AxesImage at 0x7f984d995850>

To do ¶

  • Use the example code above for Sentinel 2 (ESA platform)
  • Simulate the following sensors from NEON:
  • Landsat 8 OLI, MODIS/VIIRS [but at 50m resolution – We do not have enough coverage to simulate the actual 250m/300m data]
  • Generate plots/images with all RSRs for all bands and all sensors (like the example above)
  • Extract different (objects) Spectral Signatures then plot from original Data Cube and
  • simulated sensor data cube
In [80]:
#Load simulated sensor RSR function
NEON.Load_RSR('NEON_Landsat8_RSR.csv')

#display RSR basic information
NEON.displayRSR()

#Retrieve RSR for a specific band
L8_RSR_RED=NEON.getBand_RSR('RED')
L8_RSR_NIR=NEON.getBand_RSR('NIR')

#Load simulated sensor RSR function
NEON.Load_RSR('NEON_MODIS_RSR_New.csv')

#display RSR basic information
NEON.displayRSR()

#Retrieve RSR for a specific band
M_RSR_RED=NEON.getBand_RSR('RED')
M_RSR_NIR=NEON.getBand_RSR('NIR')

#Load simulated sensor RSR function
NEON.Load_RSR('NEON_VIIRS_RSR.csv')

#display RSR basic information
NEON.displayRSR()

#Retrieve RSR for a specific band
V_RSR_RED=NEON.getBand_RSR('RED')
V_RSR_NIR=NEON.getBand_RSR('NIR')
Reading RSR: NEON_Landsat8_RSR.csv
Convolution:
Sensor: LANDSAT8/OLI
Resolution: 30.0 meters
Bands: ['BLUE', 'GREEN', 'RED', 'NIR']
Reading RSR: NEON_MODIS_RSR_New.csv
Convolution:
Sensor: MODIS/AQUA
Resolution: 250.0 meters
Bands: ['BLUE', 'GREEN', 'RED', 'NIR']
Reading RSR: NEON_VIIRS_RSR.csv
Convolution:
Sensor: VIIRS
Resolution: 500.0 meters
Bands: ['BLUE', 'GREEN', 'RED', 'NIR']
In [124]:
#create a plot of the RSR
plt.figure(figsize=(15,5))
plt.title('RSR Sensor - Landsat 8')
plt.xlabel('Wavelength (nm)')
plt.ylabel('RSR')
#RSR_RED[:,0] is the X Axis
#RSR_RED[:,1] is the Y Axis
plt.plot(M_RSR_RED[:,0],M_RSR_RED[:,1],'-+',color='red',label='RED MODIS')
plt.plot(M_RSR_NIR[:,0],M_RSR_NIR[:,1],'-+',color='black',label='NIR MODIS')

plt.plot(V_RSR_RED[:,0],V_RSR_RED[:,1],'-o',color='red',label='RED VIIRS')
plt.plot(V_RSR_NIR[:,0],V_RSR_NIR[:,1],'-o',color='black',label='NIR VIIRS')

plt.plot(L8_RSR_RED[:,0],L8_RSR_RED[:,1],'-.',color='red',label='RED Landsat 8')
plt.plot(L8_RSR_NIR[:,0],L8_RSR_NIR[:,1],'-.',color='black',label='NIR Landsat 8')

plt.plot(S2_RSR_RED[:,0],S2_RSR_RED[:,1],color='red', label='RED Sentinel 2')
plt.plot(S2_RSR_NIR[:,0],S2_RSR_NIR[:,1],color='black', label='NIR Sentinel 2')

plt.grid(True)
plt.legend(loc=[0,-0.5], ncol=4)
plt.tight_layout()
plt.show()
In [53]:
#apply convolution to dataset
LANDSATCube=NEON.Spectral_Spatial_Convolution(WavesList,reflArray)
Processing convolution...
 band: BLUE
 min= 436.0 max= 527.0
  11  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30 .
 band: GREEN
 min= 513.0 max= 600.0
  27  28  29  30  31  32  33  34  35  36  37  38  39  40  41  42  43  44 .
 band: RED
 min= 626.0 max= 682.0
  49  50  51  52  53  54  55  56  57  58  59  60 .
 band: NIR
 min= 830.0 max= 896.0
  90  91  92  93  94  95  96  97  98  99  100  101  102  103 .
Spatial Resampling at  30.0  [ 33 , 33 ]
In [54]:
RGBImage=vip.Image_getRGB(LANDSATCube[:,:,2],LANDSATCube[:,:,1],LANDSATCube[:,:,0],8000)
# Display RGB True color Image
plt.figure(figsize=(10,10))
plt.title('LANDSAT 8 simulated RGB at actual spatial resolution')
plt.imshow(RGBImage)
Out[54]:
<matplotlib.image.AxesImage at 0x7f984cea9160>
In [56]:
# Rerun Convolution at a different spatial resolution, 5 meters
LANDSATCube=NEON.Spectral_Spatial_Convolution(WavesList,reflArray,5)
RGBImage=vip.Image_getRGB(LANDSATCube[:,:,2],LANDSATCube[:,:,1],LANDSATCube[:,:,0],8000)
Processing convolution...
 band: BLUE
 min= 436.0 max= 527.0
  11  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30 .
 band: GREEN
 min= 513.0 max= 600.0
  27  28  29  30  31  32  33  34  35  36  37  38  39  40  41  42  43  44 .
 band: RED
 min= 626.0 max= 682.0
  49  50  51  52  53  54  55  56  57  58  59  60 .
 band: NIR
 min= 830.0 max= 896.0
  90  91  92  93  94  95  96  97  98  99  100  101  102  103 .
Spatial Resampling at  5  [ 200 , 200 ]
In [57]:
# Display RGB True color Image
plt.figure(figsize=(10,10))
plt.title('LANDSAT 8 simulated RGB at hypothetical 5 meters')
plt.imshow(RGBImage)
Out[57]:
<matplotlib.image.AxesImage at 0x7f9865781d30>
In [58]:
# ReRun only spectral Convolution (not spatial) 
LANDSATCube=NEON.Spectral_Convolution(WavesList,reflArray)
RGBImage=vip.Image_getRGB(LANDSATCube[:,:,2],LANDSATCube[:,:,1],LANDSATCube[:,:,0],8000)
Processing convolution...
 band: BLUE
 min= 436.0 max= 527.0
  11  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30 .
 band: GREEN
 min= 513.0 max= 600.0
  27  28  29  30  31  32  33  34  35  36  37  38  39  40  41  42  43  44 .
 band: RED
 min= 626.0 max= 682.0
  49  50  51  52  53  54  55  56  57  58  59  60 .
 band: NIR
 min= 830.0 max= 896.0
  90  91  92  93  94  95  96  97  98  99  100  101  102  103 .
In [59]:
# Display RGB True color Image
plt.figure(figsize=(10,10))
plt.title('LANDSAT 8 simulated but no spatial convolution')
plt.imshow(RGBImage,extent=cube_ext)
Out[59]:
<matplotlib.image.AxesImage at 0x7f984ced5c10>
In [188]:
#display a message the program ended
print("program ended.")
program ended.

EX-2

In this exercise you will learn how to design and use a mask to assist with data and images analysis.
A Mask is usually a thematic (discrete values) image made up of ZEROS [0] and ONES [1], or other numbers/classes.

Original Imager Mask 1 Mask 2

The mask(s) define(s) the locations and extent of each features.
To understand the behavior (ex: spectral) of remote sensing data over a particular object/feature we can run a spectral signature (or analysis) over a single or few pixels from the feature, or better yet (Sample) average the spectral signatures of many pixels corresponding to that feature, or even all of them. But how?.
Masking provides an answer, and may contain more than a single feature and result from more than one test/method. The analysis will then be applied to only the feature(s) of interest.

In [88]:
# load library modules 
import os
import numpy as np
import matplotlib.pyplot as plt
import viplab_lib3 as vip
In [89]:
#Load NEON Dataset    
filename="NEON_Composed_Img.bsq"
nrows=500
ncols=500
nbands = 426
datatype=np.int16
# Read the wavelength values from the textfile
file = open('NEON_wavelength_values.txt', 'r') 
Xvalues= file.readlines() 
nvalues=len(Xvalues)
#convert text to number (float)
for i in range(0,nvalues):
  Xvalues[i]=float(Xvalues[i]) 

# assign some bands with an index to easy remember them
bandRED=53
bandGREEN=37
bandBLUE=18
bandNIR=98
bandSWIR1=246
bandSWIR2=340
In [90]:
# Read all bands into a single DataCube
DataCube=vip.BSQ_band_read(filename,-426,nrows,ncols)

# Combine the Red, Green and Blue data into an RGB model for display
print("Creating RGB Image...")

# Extract some bands from the cube
DataRED=DataCube[:,:,bandRED]
DataGREEN=DataCube[:,:,bandGREEN]
DataBLUE=DataCube[:,:,bandBLUE]
DataNIR=DataCube[:,:,bandNIR]

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
plt.figure(figsize=(8,8))
plt.axis('off')

plt.title("NEON RGB - This a composite image made up of pieces")
plt.imshow(RGBImage)
nbands= 426
Creating RGB Image...
Out[90]:
<matplotlib.image.AxesImage at 0x7f9857612640>

Let us try the simplest form of masking based on a threshold technique.¶

Create a theshold based mask (this is similar to Machine learning - the label here is the threshold).¶
1. In the following examples we use various Red, NIR, and NDVI (We will learn abotu NDVI) thresholds to define how a white roof, water, tree, etc.. behaves¶
2. Experiment with the code to create different masks for different features.¶
3. Try to use more than one band to redefine the mask(s) and see how that improves the mask performance.¶
In [91]:
# Compute Stat values for a RED band
RED_avg=DataRED.mean()
RED_stdev=DataRED.std()
print("Full Image RED: avg=",RED_avg,", stdev=",RED_stdev)
Full Image RED: avg= 1118.491816 , stdev= 1341.629444353776

Define a new index called NDVI = (NIR-Red)/(NIR+Red)¶

More on this in later laectures¶

In [92]:
#Use masks to compute stats about objects only
# Automatic mask creation:
# Create a mask to select white roofs (using the RED band only for example) 

# An example of a more complex mask with NDVI
# NDVI = (NIR-red)/(NIR+red)

#Notice how easy it is in Python to run complex data processing, in this case creating NDVI is one line of code
NDVI_Numerator=DataNIR-DataRED
NDVI_Denominator=DataNIR+DataRED

# In order to avoid division by zero we need to be careful, the code below does some checking befoe outputing NDVI 
with np.errstate(divide='ignore', invalid='ignore'):
    NDVI =np.true_divide(NDVI_Numerator,NDVI_Denominator)
    NDVI[NDVI == np.inf] = 0
    NDVI = np.nan_to_num(NDVI)

Find all the roofs in the image, How?¶

A primitive way would be to "say All roofs have RED signatures > 5000 [0.5] since they are bright"¶

Notice the results and think of how it could be improved¶

In [93]:
#Find roofs by using a red band threshold mask     
MaskRoof=DataRED > 5000 

fig, figarr = plt.subplots(1, 2, figsize=(12,6))
fig.tight_layout()

figarr[0].set_title("Roof Mask ")
figarr[0].axis('off')
figarr[0].imshow(MaskRoof)

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
figarr[1].axis('off')

figarr[1].set_title("NEON RGB - This a composite image made up of pieces")
figarr[1].imshow(RGBImage)
Out[93]:
<matplotlib.image.AxesImage at 0x7f985750d280>

Find all the water bodies in the image, How?¶

Again a simple way would be to "say All water" have NIR < 100, dark"¶

Notice the results and think of how it could be improved¶

Notice how you dicovered a bunch of swimming pools¶

In [94]:
#Find water bodies by using a NIR band threshold mask     
MaskWater=DataNIR < 100

fig, figarr = plt.subplots(1, 2, figsize=(12,6))
fig.tight_layout()

figarr[0].set_title("Water Mask")
figarr[0].axis('off')
figarr[0].imshow(MaskWater)

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
figarr[1].axis('off')

figarr[1].set_title("NEON RGB - This a composite image made up of pieces")
figarr[1].imshow(RGBImage)
Out[94]:
<matplotlib.image.AxesImage at 0x7f984e524eb0>

How about the Pecan Trees?¶

Again a simple way would be to "say PEcan trees" have NDVI > 0.9. This is a special topic for later for now think of it as index"¶

Notice the results and think of how it could be improved¶

In [97]:
# Mask Pecan trees (using NDVI band)
MaskPecan= NDVI > 0.90
fig, figarr = plt.subplots(1, 2, figsize=(12,6))
fig.tight_layout()

figarr[0].set_title("Pecan Trees Mask using NDVI < 0.9  threshold")
figarr[0].axis('off')
figarr[0].imshow(MaskPecan)

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
figarr[1].axis('off')

figarr[1].set_title("NEON RGB - This a composite image made up of pieces")
figarr[1].imshow(RGBImage)
Out[97]:
<matplotlib.image.AxesImage at 0x7f9856bd58e0>

How about any plants?¶

Try NDVI > 0.6¶

Notice the results and think of how it could be improved¶

In [98]:
# Mask Pecan trees (using NDVI band)
MaskPecan= NDVI > 0.60
fig, figarr = plt.subplots(1, 2, figsize=(12,6))
fig.tight_layout()

figarr[0].set_title("Plants Mask using NDVI > 0.6 threshold")
figarr[0].axis('off')
figarr[0].imshow(MaskPecan)

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
figarr[1].axis('off')

figarr[1].set_title("NEON RGB - This a composite image made up of pieces")
figarr[1].imshow(RGBImage)
Out[98]:
<matplotlib.image.AxesImage at 0x7f9856aa4700>
In [99]:
# Get the average value of the RED for the houses using the mask
# This code will travel the full image and only retains and works 
# with any pixel that meets the Mask conditions

Roof_RED_avg=np.mean(DataRED[ MaskRoof ==True ])
Roof_RED_stdev=np.std(DataRED[ MaskRoof ==True ])

# Get the average value of the RED for the water pixels
Water_RED_avg=np.mean(DataRED[ MaskWater ==True ])
Water_RED_stdev=np.std(DataRED[ MaskWater ==True ])
In [101]:
#Using a predifined mask - from D2L
# Mask values
#  1 Water
#  2 White roofs
#  3 Walnut orchard trees
#  4 Forest trees

# Read a mask from file
MaskData1= vip.BSQ_band_read('NEON_Mask.bsq',0,500,500)
MaskData2= vip.BSQ_band_read('NEON_Mask.bsq',1,500,500)
In [102]:
#Display the masks
fig, figarr = plt.subplots(1, 3, figsize=(12,6))
fig.tight_layout()

figarr[0].set_title("Mask from file")
figarr[0].axis('off')
figarr[0].imshow(MaskData1)

figarr[1].set_title("Mask from file")
figarr[1].axis('off')
figarr[1].imshow(MaskData2)

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
figarr[2].axis('off')

figarr[2].set_title("NEON RGB - This a composite image made up of pieces")
figarr[2].imshow(RGBImage)
Out[102]:
<matplotlib.image.AxesImage at 0x7f984adf0cd0>

We will now use the masks to analyze the data in the image but only for the masked area (feature).¶

In [103]:
# Get the average value of the RED for the water pixels
MaskPecan=MaskData2 == 3
Pecan_RED_avg=np.mean(DataRED[ MaskPecan ==True ])
Pecan_RED_stdev=np.std(DataRED[ MaskPecan ==True ])

MaskForest=MaskData2 == 4
Forest_RED_avg=np.mean(DataRED[ MaskForest ==True ])
Forest_RED_stdev=np.std(DataRED[ MaskForest ==True ])

print("RED stats using masks")
print("Houses roof: avg=",Roof_RED_avg,", stdev=",Roof_RED_stdev)
print("Water: avg=",Water_RED_avg,", stdev=",Water_RED_stdev)
print("Walnut Trees: avg=",Pecan_RED_avg,", stdev=",Pecan_RED_stdev)
print("Forest Trees: avg=",Forest_RED_avg,", stdev=",Pecan_RED_stdev)
RED stats using masks
Houses roof: avg= 7200.777116919706 , stdev= 1024.063550604201
Water: avg= 747.7773073666384 , stdev= 102.12372098425168
Walnut Trees: avg= 322.8010876132931 , stdev= 178.50904048131653
Forest Trees: avg= 506.33531500288996 , stdev= 178.50904048131653

Construct the average spectral signatures of each feature from the mask. We no longer look at the spectra for a single pixel¶

We will estimate the mean spectra for all pixels that are part of any of the Masked features¶

In [119]:
#Using the predifined mask - 
# Mask values
# 1 Water
# 2 White roofs
# 3 Walnut trees
# 4 Forest
# 5 Grass
# 6 Road
# 0 everything else
Objects_Label = ['Pecan Trees','Water']
Objects_Color = ['darkgreen','blue']
Avg_Spectra=np.zeros((2,426))

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[0,band]=np.average(Data2D[ MaskData2 ==3 ])

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[1,band]=np.average(Data2D[ MaskData2 ==1 ])
    
plt.figure(figsize=(12,6))
plt.title('Features Spectral Signatures',fontsize=20)
plt.xlabel('Wavelength (nm)',fontsize=12)
plt.ylabel('Reflectance ',fontsize=12)
    
for i,type in enumerate(Objects_Label):
    Spectra=Avg_Spectra[i]
    plt.plot(Xvalues,Spectra,color=Objects_Color[i],label=Objects_Label[i])
plt.grid()
plt.legend(loc='upper left')       
Out[119]:
<matplotlib.legend.Legend at 0x7f9856936430>

To Do¶

Follow the exercise code above and,¶

  1. Compute the stats for all bands over the predefined masks (BLUE, GREEN, RED, NIR, SWIR1)
  2. Modify the code to mask through the full cube and generate population (masked) spectra
  3. Average signal over mask feature for all bands (426) and generate an average plot of spectral signatures of the feature
  4. Generate your own mask(s) using what you have learned so far in terms of spectral signatures (and/or NDVI)
  5. Combine more than one thresholds/tests to generate a robust more useful mask
  6. Describe how the mask changes the spectra?

Problem 1: Compute the stats for all bands over the predefined masks (BLUE, GREEN, RED, NIR, SWIR1)¶

In [111]:
# Data SWIR1 and SWIR2 have not been created yet
DataSWIR1=DataCube[:,:,bandSWIR1]
DataSWIR2=DataCube[:,:,bandSWIR2]
In [116]:
bands = {'Blue':DataBLUE, 'Green':DataGREEN, 'Red':DataRED, 
         'NIR':DataNIR, 'SWIR1':DataSWIR1, 'SWIR2':DataSWIR2}

for x,y in bands.items():
    Data = y
    name = x

    Roof_avg=np.mean(Data[ MaskRoof ==True ])
    Roof_stdev=np.std(Data[ MaskRoof ==True ])

    Water_avg=np.mean(Data[ MaskWater ==True ])
    Water_stdev=np.std(Data[ MaskWater ==True ])

    MaskPecan=MaskData2 == 3
    Pecan_avg=np.mean(Data[ MaskPecan ==True ])
    Pecan_stdev=np.std(Data[ MaskPecan ==True ])

    MaskForest=MaskData2 == 4
    Forest_avg=np.mean(Data[ MaskForest ==True ])
    Forest_stdev=np.std(Data[ MaskForest ==True ])

    print(name, "band stats using masks")
    print("Houses roof: avg=",Roof_avg,", stdev=",Roof_stdev)
    print("Water: avg=",Water_avg,", stdev=",Water_stdev)
    print("Pecan Trees: avg=",Pecan_avg,", stdev=",Pecan_stdev)
    print("Forest Trees: avg=",Forest_avg,", stdev=",Pecan_stdev)
    print()
Blue band stats using masks
Houses roof: avg= 6334.894819220535 , stdev= 1194.442016196077
Water: avg= 374.25486875529214 , stdev= 114.5380342543871
Pecan Trees: avg= 229.69679758308158 , stdev= 76.5304814696349
Forest Trees: avg= 472.00379819998346 , stdev= 76.5304814696349

Green band stats using masks
Houses roof: avg= 6853.501956487713 , stdev= 1074.4543962789367
Water: avg= 735.9729043183743 , stdev= 132.59962507414457
Pecan Trees: avg= 547.9279758308157 , stdev= 178.34344309420035
Forest Trees: avg= 697.447774750227 , stdev= 178.34344309420035

Red band stats using masks
Houses roof: avg= 7200.777116919706 , stdev= 1024.063550604201
Water: avg= 747.7773073666384 , stdev= 102.12372098425168
Pecan Trees: avg= 322.8010876132931 , stdev= 178.50904048131653
Forest Trees: avg= 506.33531500288996 , stdev= 178.50904048131653

NIR band stats using masks
Houses roof: avg= 7291.050868680545 , stdev= 985.2079469706347
Water: avg= 70.11727349703641 , stdev= 14.895863686638762
Pecan Trees: avg= 4829.812930513595 , stdev= 588.8033590079334
Forest Trees: avg= 4845.084964082239 , stdev= 588.8033590079334

SWIR1 band stats using masks
Houses roof: avg= 6484.887462826733 , stdev= 907.4672793325856
Water: avg= 4.6685012701100765 , stdev= 7.364797051045533
Pecan Trees: avg= 1905.8157099697885 , stdev= 498.6547040104472
Forest Trees: avg= 1957.9827429609445 , stdev= 498.6547040104472

SWIR2 band stats using masks
Houses roof: avg= 4933.191579276882 , stdev= 712.961764485916
Water: avg= 4.484335309060119 , stdev= 6.789876423663231
Pecan Trees: avg= 578.5833232628398 , stdev= 284.8599892051938
Forest Trees: avg= 655.9995871521758 , stdev= 284.8599892051938

In [ ]:
 

Problems 2 and 3: Modify the code to mask through the full cube and generate population (masked) spectra; Average signal over mask feature for all bands (426) and generate an average plot of spectral signatures of the feature¶

In [123]:
#Using the predifined mask - 
# Mask values
# 1 Water
# 2 White roofs
# 3 Walnut trees
# 4 Forest
# 5 Grass
# 6 Road
# 0 everything else
Objects_Label = ['Water','White Roofs','Pecan Trees','Forest','Grass','Road','party (everything else)']
Objects_Color = ['blue','Black','brown','darkgreen','lime','grey','red']
Avg_Spectra=np.zeros((7,426))

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[0,band]=np.average(Data2D[ MaskData2 ==1 ])

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[1,band]=np.average(Data2D[ MaskData2 ==2 ])

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[2,band]=np.average(Data2D[ MaskData2 ==3 ])
    
for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[3,band]=np.average(Data2D[ MaskData2 ==4 ])
    
for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[4,band]=np.average(Data2D[ MaskData2 ==5 ])

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[5,band]=np.average(Data2D[ MaskData2 ==6 ])

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[6,band]=np.average(Data2D[ MaskData2 ==0 ])
    
    
plt.figure(figsize=(12,6))
plt.title('Features Spectral Signatures',fontsize=20)
plt.xlabel('Wavelength (nm)',fontsize=12)
plt.ylabel('Reflectance ',fontsize=12)
    
for i,type in enumerate(Objects_Label):
    Spectra=Avg_Spectra[i]
    plt.plot(Xvalues,Spectra,color=Objects_Color[i],label=Objects_Label[i])
plt.grid()
plt.legend(loc='upper left')       
Out[123]:
<matplotlib.legend.Legend at 0x7f9856796b80>

Problem 4: Generate your own mask(s) using what you have learned so far in terms of spectral signatures (and/or NDVI)¶

Going to generate a mask for only schrub and grassland based on this publication of NDVI value ranges https://www.researchgate.net/figure/Suitable-normalized-difference-vegetation-index-NDVI-ranges-identified-for-the-land_tbl1_330284135

In [126]:
# Mask Schrub and grassland (using NDVI band) 0.18-0.27
MaskSL= (NDVI > 0.18) & (NDVI < 0.27)
fig, figarr = plt.subplots(1, 2, figsize=(12,6))
fig.tight_layout()

figarr[0].set_title("Schrub and Grassland Mask using NDVI 0.18-0.27  range")
figarr[0].axis('off')
figarr[0].imshow(MaskSL)

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
figarr[1].axis('off')

figarr[1].set_title("NEON RGB - This a composite image made up of pieces")
figarr[1].imshow(RGBImage)
Out[126]:
<matplotlib.image.AxesImage at 0x7f9856f5bc70>

Problem 5: Combine more than one thresholds/tests to generate a robust more useful mask¶

Sort of did this above but now I'm going to see if I can do this just for urban sprawl (same source as above) with NDVI threshold of 0.015–0.14

In [140]:
# Mask urban sprawl (using NDVI band) 0.015–0.14
mmin=0.015
mmax=0.14
Name = 'Urban Sprawl'
Mask= (NDVI > 0.015) & (NDVI < 0.14)
fig, figarr = plt.subplots(1, 2, figsize=(12,6))
fig.tight_layout()

figarr[0].set_title(Name+' mask using NDVI '+str(mmin)+'-'+str(mmax)+' range')
figarr[0].axis('off')
figarr[0].imshow(Mask)

# Display RGB Image
RGBImage=vip.Image_getRGB(DataRED,DataGREEN,DataBLUE,8000)
# Display RGB True color Image
figarr[1].axis('off')

figarr[1].set_title("NEON RGB - This a composite image made up of pieces")
figarr[1].imshow(RGBImage)
Out[140]:
<matplotlib.image.AxesImage at 0x7f985772e340>

Problem 6: Describe how the mask changes the spectra?¶

In [142]:
#Using the predifined mask - 
# Mask values
# 1 Water
# 2 White roofs
# 3 Walnut trees
# 4 Forest
# 5 Grass
# 6 Road
# 0 everything else

CustomMask= (NDVI > 0.015) & (NDVI < 0.14)

Objects_Label = ['White Roofs','Road','Custom Urban Mask']
Objects_Color = ['Black','grey','red']
Avg_Spectra=np.zeros((3,426))

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[0,band]=np.average(Data2D[ MaskData2 ==2 ])

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[1,band]=np.average(Data2D[ MaskData2 ==6 ])

for band in range(0,nbands) :
    Data2D=DataCube[:,:,band]
    Avg_Spectra[2,band]=np.average(Data2D[CustomMask])
    
    
plt.figure(figsize=(12,6))
plt.title('Features Spectral Signatures',fontsize=20)
plt.xlabel('Wavelength (nm)',fontsize=12)
plt.ylabel('Reflectance ',fontsize=12)
    
for i,type in enumerate(Objects_Label):
    Spectra=Avg_Spectra[i]
    plt.plot(Xvalues,Spectra,color=Objects_Color[i],label=Objects_Label[i])
plt.grid()
plt.legend(loc='upper left')       
Out[142]:
<matplotlib.legend.Legend at 0x7f98576c0ac0>

The spectral signatures of my custom mask versus the other urban values is in between the two other spectral signatures which is exactly what I would expect since it is averaging across all urban land. Also, I would expect its reflectance to be much less than the white roofs themselves but since it includes the white roofs, I would expect it to be more than roads. Overall, I'm pretty pleased with this mask.

In [143]:
#display a message the program ended
print("program ended.")
program ended.
In [ ]: